Divergence measures and a general framework for local variational approximation
Authors
Abstract
The local variational method is a technique for approximating an intractable posterior distribution in Bayesian learning. This article formulates a general framework for local variational approximation and shows that its objective function decomposes into the sum of the Kullback information and the expected Bregman divergence from the approximating posterior distribution to the Bayesian posterior distribution. Based on a geometrical argument in the space of approximating posteriors, we propose an efficient method to evaluate an upper bound of the marginal likelihood. Moreover, we demonstrate that the variational Bayesian approach for latent variable models can be viewed as a special case of this general framework.
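The decomposition stated in the abstract can be sketched symbolically. The notation below (q for the approximating posterior, p(w | D) for the Bayesian posterior, d_φ for a Bregman divergence generated by a convex function φ) is our assumption for illustration, not necessarily the paper's exact symbols:

```latex
% Schematic decomposition (illustrative notation, not the paper's exact formulation):
% \bar{F}(q)  : objective of the local variational approximation
% q(w)        : approximating posterior;  p(w \mid D) : Bayesian posterior
% d_\varphi   : Bregman divergence generated by a convex function \varphi
\bar{F}(q)
  \;=\; \mathrm{KL}\bigl(q(w) \,\|\, p(w \mid D)\bigr)
  \;+\; \mathbb{E}_{q(w)}\bigl[\, d_\varphi\bigl(q(w),\, p(w \mid D)\bigr) \,\bigr]
```

Since both terms are nonnegative, minimizing the objective over q drives the approximation toward the Bayesian posterior in both the KL and the Bregman sense.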
Similar resources
A Relaxed Extra Gradient Approximation Method of Two Inverse-Strongly Monotone Mappings for a General System of Variational Inequalities, Fixed Point and Equilibrium Problems
Full text
Variational Inference for Bayesian Mixtures of Factor Analysers
We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e., the number of factors in each factor analyser). Alternatively it...
Full text
Alpha-Divergences in Variational Dropout
We investigate the use of alternative divergences to Kullback-Leibler (KL) in variational inference (VI), based on Variational Dropout [10]. Stochastic gradient variational Bayes (SGVB) [9] is a general framework for estimating the evidence lower bound (ELBO) in variational Bayes. In this work, we extend the SGVB estimator using Alpha-Divergences, which are alternatives to...
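To make the family of divergences mentioned above concrete, here is a minimal sketch of Amari's alpha-divergence for discrete distributions; the function names and the choice of parameterization are ours, not the cited paper's, and the paper applies alpha-divergences inside a stochastic variational estimator rather than to fixed discrete distributions:

```python
import math

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence D_alpha(p || q) for two discrete distributions.

    In the limit alpha -> 1 this recovers KL(p || q), and alpha -> 0 gives
    KL(q || p); intermediate alphas interpolate between the two.
    """
    if alpha in (0.0, 1.0):
        raise ValueError("use the KL limit directly for alpha in {0, 1}")
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return (1.0 - s) / (alpha * (1.0 - alpha))

def kl(p, q):
    """KL(p || q) for discrete distributions, skipping zero-probability terms."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(alpha_divergence(p, q, 0.5))      # symmetric-like interpolant
print(alpha_divergence(p, q, 0.999))    # close to kl(p, q)
```

For identical distributions the divergence is zero for every alpha, which is a quick sanity check on any implementation.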
Full text
Variational Particle Approximations
Monte Carlo methods provide a powerful framework for approximating probability distributions with a set of stochastically sampled particles. In this paper, we rethink particle approximations from the perspective of variational inference, where the particles play the role of variational parameters. This leads to a deterministic version of Monte Carlo in which the particles are selected to optimi...
Full text
A robust variational approach for simultaneous smoothing and estimation of DTI
Estimating diffusion tensors is an essential step in many applications, such as diffusion tensor image (DTI) registration, segmentation and fiber tractography. Most of the methods proposed in the literature for this task are not simultaneously statistically robust and feature-preserving. In this paper, we propose a novel and robust variational framework for simultaneous smoothing an...
Full text
Journal:
Neural Networks: the official journal of the International Neural Network Society
Volume 24, Issue 10
Pages: -
Publication year: 2011